Learning Sparsely Used Overcomplete Dictionaries via Alternating Minimization
Abstract
We consider the problem of sparse coding, where each sample consists of a sparse linear combination of a set of dictionary atoms, and the task is to learn both the dictionary elements and the mixing coefficients. Alternating minimization is a popular heuristic for sparse coding, in which the dictionary and the coefficients are estimated in alternate steps, each with the other held fixed. Typically, the coefficients are estimated via l1 minimization with the dictionary fixed, and the dictionary is estimated via least squares with the coefficients fixed. In this paper, we establish local linear convergence for this variant of alternating minimization and show that the basin of attraction of the global optimum (corresponding to the true dictionary and coefficients) has size O(1/s), where s is the sparsity level of each sample, provided the dictionary satisfies the restricted isometry property (RIP). Combined with recent results on approximate dictionary estimation, this yields provable guarantees for exact recovery of both the dictionary elements and the coefficients when the dictionary elements are incoherent.
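For illustration, the alternating scheme described above can be sketched in a few lines of Python/NumPy: an l1-regularized sparse-coding step with the dictionary fixed, followed by a least-squares dictionary update with the coefficients fixed. The ISTA solver, the regularization weight `lam`, the column renormalization, and the iteration counts below are illustrative choices for a minimal sketch, not the exact procedure analyzed in the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal map of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_coding_step(Y, A, lam, n_iter=200):
    """Estimate coefficients X by l1-regularized regression (ISTA), dictionary A fixed."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the quadratic part
    X = np.zeros((A.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        grad = A.T @ (A @ X - Y)
        X = soft_threshold(X - grad / L, lam / L)
    return X

def dictionary_step(Y, X):
    """Estimate dictionary A by least squares, coefficients X fixed, then renormalize columns."""
    A = Y @ np.linalg.pinv(X)
    norms = np.linalg.norm(A, axis=0, keepdims=True)
    return A / np.maximum(norms, 1e-12)

def alternating_minimization(Y, A_init, lam=0.1, n_rounds=20):
    """Alternate the l1 sparse-coding step and the least-squares dictionary update."""
    A = A_init.copy()
    for _ in range(n_rounds):
        X = sparse_coding_step(Y, A, lam)
        A = dictionary_step(Y, X)
    return A, X

# Tiny synthetic example: d-dimensional samples, overcomplete dictionary with r > d atoms,
# each coefficient vector s-sparse.
rng = np.random.default_rng(0)
d, r, n, s = 20, 30, 500, 3
A_true = rng.standard_normal((d, r))
A_true /= np.linalg.norm(A_true, axis=0)
X_true = np.zeros((r, n))
for j in range(n):
    X_true[rng.choice(r, s, replace=False), j] = rng.standard_normal(s)
Y = A_true @ X_true
A_hat, X_hat = alternating_minimization(Y, A_true + 0.1 * rng.standard_normal((d, r)))
```

In the analysis, the initialization must already lie within the O(1/s) basin of attraction around the true dictionary for local linear convergence to apply; the synthetic example above simply perturbs the true dictionary to mimic such a warm start.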
Similar Papers
Learning Sparsely Used Overcomplete Dictionaries
We consider the problem of learning sparsely used overcomplete dictionaries, where each observation is a sparse combination of elements from an unknown overcomplete dictionary. We establish exact recovery when the dictionary elements are mutually incoherent. Our method consists of a clustering-based initialization step, which provides an approximate estimate of the true dictionary with guarante...
Overcomplete Dictionary Design by Empirical Risk Minimization
Recently, there has been growing interest in the application of sparse representation to inverse problems. Most studies have concentrated on devising ways to sparsely represent a solution using a given prototype overcomplete dictionary. Very few studies have addressed the more challenging problem of constructing an optimal overcomplete dictionary, and even these were primarily devoted to the ...
Exact Recovery of Sparsely Used Overcomplete Dictionaries
We consider the problem of learning overcomplete dictionaries in the context of sparse coding, where each sample selects a sparse subset of dictionary elements. Our method consists of two stages, viz., initial estimation of the dictionary, and a clean-up phase involving estimation of the coefficient matrix and re-estimation of the dictionary. We prove that our method exactly recovers both the ...
Evolution-enhanced multiscale overcomplete dictionaries learning for image denoising
In this paper, a multiscale overcomplete dictionary learning approach is proposed for image denoising by exploiting the multiscale property and sparse representation of images. The images are first sparsely represented by a translation-invariant dictionary, and the coefficients are then denoised using learned multiscale dictionaries. Dictionary learning can be reduced to a non-convex l0...
Critical Points Of An Autoencoder Can Provably Recover Sparsely Used Overcomplete Dictionaries
In Dictionary Learning one is trying to recover incoherent matrices A* ∈ R^{n×h} (typically overcomplete and whose columns are assumed to be normalized) and sparse vectors x* ∈ R^h with a small support of size h^p for some 0 < p < 1, while being given access to observations y ∈ R^n where y = A*x*. In this work we undertake a rigorous analysis of the possibility that dictionary learning could be performed...
Journal: SIAM Journal on Optimization
Volume 26, Issue: -
Pages: -
Publication year: 2016